On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems
Authors

Abstract
It is well known that the conjugate gradient method and a quasi-Newton method, using any well-defined update matrix from the one-parameter Broyden family of updates, produce the same iterates on a quadratic problem with positive-definite Hessian. This equivalence does not hold for an arbitrary quasi-Newton method. We discuss more precisely the conditions on the update matrix that give rise to this behavior, and show that the crucial fact is that the components of each update matrix are chosen in the last two dimensions of the Krylov subspaces defined by the conjugate gradient method. In a framework based on a sufficient condition for obtaining mutually conjugate search directions, we show that the one-parameter Broyden family is complete. We also show that the update matrices from the one-parameter Broyden family are almost always well-defined on a quadratic problem with positive-definite Hessian. The only exception arises when the symmetric rank-one update is used and the unit step length is taken in the same iteration; in this case it is the Broyden parameter that becomes undefined.
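The stated equivalence can be checked numerically. The following is a minimal sketch, using BFGS (one member of the Broyden family) with exact line search on a random positive-definite quadratic; the function names and the test problem are illustrative, not taken from the paper.

```python
import numpy as np

def cg_iterates(H, b, x0, iters):
    """Conjugate gradient on f(x) = 0.5 x^T H x - b^T x; returns all iterates."""
    x = x0.astype(float).copy()
    r = H @ x - b                       # gradient of f at x
    p = -r
    xs = [x.copy()]
    for _ in range(iters):
        alpha = (r @ r) / (p @ H @ p)   # exact step length on a quadratic
        x = x + alpha * p
        r_new = H @ x - b
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
        xs.append(x.copy())
    return xs

def bfgs_iterates(H, b, x0, iters):
    """BFGS (a Broyden-family member) with exact line search on the quadratic."""
    n = len(x0)
    x = x0.astype(float).copy()
    Binv = np.eye(n)                    # inverse-Hessian approximation
    xs = [x.copy()]
    for _ in range(iters):
        g = H @ x - b
        p = -Binv @ g
        alpha = -(g @ p) / (p @ H @ p)  # exact minimizer along p
        s = alpha * p
        x = x + s
        y = H @ s                       # gradient difference on a quadratic
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        Binv = (I - rho * np.outer(s, y)) @ Binv @ (I - rho * np.outer(y, s)) \
               + rho * np.outer(s, s)
        xs.append(x.copy())
    return xs

# Illustrative test problem (an assumption, not from the paper):
rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
H = A.T @ A + n * np.eye(n)             # positive-definite Hessian
b = rng.standard_normal(n)
x0 = np.zeros(n)
cg = cg_iterates(H, b, x0, 4)
qn = bfgs_iterates(H, b, x0, 4)
```

On this problem the two lists of iterates agree to within round-off, as the theorem predicts.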
Similar articles
Quasi-Newton Methods for Image Restoration
Many iterative methods that are used to solve Ax = b can be derived as quasi-Newton methods for minimizing the quadratic function ½xᵀAᵀAx − xᵀAᵀb. In this paper, several such methods are considered, including conjugate gradient least squares (CGLS), Barzilai-Borwein (BB), residual norm steepest descent (RNSD) and Landweber (LW). Regularization properties of these methods are studied by analyzing t...
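Of the methods named in this snippet, the Landweber (LW) iteration is the simplest to sketch: it is gradient descent on ½‖Ax − b‖². The test matrix and step size below are illustrative assumptions, not from the cited paper.

```python
import numpy as np

def landweber(A, b, tau, iters):
    """Landweber iteration: gradient descent on 0.5 * ||A x - b||^2,
    i.e. x_{k+1} = x_k + tau * A^T (b - A x_k)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + tau * (A.T @ (b - A @ x))
    return x

# Illustrative well-conditioned example; the iteration converges to the
# least-squares solution whenever 0 < tau < 2 / sigma_max(A)^2.
rng = np.random.default_rng(1)
A = np.eye(4) + 0.1 * rng.standard_normal((4, 4))
b = rng.standard_normal(4)
tau = 1.0 / np.linalg.norm(A, 2) ** 2
x = landweber(A, b, tau, 1000)
```

For ill-conditioned A, as in image restoration, stopping the iteration early acts as regularization, which is the behavior the cited paper analyzes.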
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of line search method, based on eigenvalue analysis. The globa...
Two Settings of the Dai-Liao Parameter Based on Modified Secant Equations
Following the setting of the Dai-Liao (DL) parameter in conjugate gradient (CG) methods, we introduce two new parameters based on the modified secant equation proposed by Li et al. (Comput. Optim. Appl. 202:523-539, 2007) with two approaches, which use an extended new conjugacy condition. The first is based on a modified descent three-term search direction, as the descent Hest...
Newton-Like Methods for Sparse Inverse Covariance Estimation
We propose two classes of second-order optimization methods for solving the sparse inverse covariance estimation problem. The first approach, which we call the Newton-LASSO method, minimizes a piecewise quadratic model of the objective function at every iteration to generate a step. We employ the fast iterative shrinkage thresholding method (FISTA) to solve this subproblem. The second approach,...
A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems, because they do not need the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods which always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we pres...
Journal:
Comp. Opt. and Appl.
Volume 60, Issue -
Pages -
Published 2015